Web Survey Bibliography
Title Fieldwork monitoring and managing with time-related paradata
Author Vandenplas, C.
Year 2017
Access date 18.09.2017
Abstract In this day and age, time is a critical element of anyone's life, especially for the active population. The same holds for different facets of the survey process: time and timing are linked to both survey costs and data quality, two essential elements of a survey. During the data collection period, the available time of potential respondents plays a key role in their decision whether or not to participate in the survey, while the interviewer's time, for telephone or face-to-face surveys, is an important factor in his or her capacity to recruit respondents. The timing of visits, calls, or the sending out of questionnaires/requests and reminders has also been shown to be a determining factor in survey participation. At the same time, requesting that interviewers work in the evening or at the weekend, or making sure that reminders for a web or mail survey are sent in a timely manner, may have cost implications. Nonresponse error is not the only type of survey error linked to time: the time taken to answer a question, also called response latency, is known to echo the cognitive effort of the respondent and, hence, data quality. On the other hand, interviewer speed can also influence data quality. Moreover, interviewer speed has been shown to depend on the rank of the interview.
In this presentation, we will give a few examples of how time-related paradata can be used to detect survey error and to improve data quality from a fieldwork monitoring perspective. Using data from the European Social Survey, we will illustrate how the yield of the fieldwork per time unit, which can be derived from the contact forms, and the interview speed, which can be derived from timers, could guide decision-making aimed at improving data quality during the fieldwork.
Access/Direct link Conference Homepage (abstract) / (presentation)
Year of publication 2017
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography - European survey research association conference 2017, ESRA, Lisbon (26)
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Paradata as an aide to questionnaire design: Improving quality and reducing burden; 2017; Timm, E.; Stewart, J.; Sidney, I.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico; 2017; Bosch Jover, O.; Revilla, M.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Measuring Subjective Health and Life Satisfaction with U.S. Hispanics; 2017; Lee, S.; Davis, R.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys; 2017; Geisen, E., Murphy, J.
- The Effect of Respondent Commitment on Response Quality in Two Online Surveys; 2017; Cibelli Hibben, K.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- The 2016 Canadian Census: An Innovative Wave Collection Methodology to Maximize Self-Response and Internet...; 2017; Mathieu, P.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- In search of best practices; 2017; Kappelhof, J. W. S.; Steijn, S.
- Redirected Inbound Call Sampling (RICS): A New Methodology; 2017; Krotki, K.; Bobashev, G.; Levine, B.; Richards, S.
- An Empirical Process for Using Non-probability Survey for Inference; 2017; Tortora, R.; Iachan, R.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- A Comparison of Two Nonprobability Samples with Probability Samples; 2017; Zack, E. S.; Kennedy, J. M.
- A test of sample matching using a pseudo-web sample; 2017; Chatrchi, G., Gambino, J.
- A Partially Successful Attempt to Integrate a Web-Recruited Cohort into an Address-Based Sample; 2017; Kott, P. S., Farrelly, M., Kamyab, K.
- Nonprobability sampling as model construction; 2017; Mercer, A. W.